Is statistical learning trainable?

Authors

  • Luca Onnis
  • Matthew Lou-Magnuson
  • Hongoak Yun
  • Erik D. Thiessen
Abstract

Statistical learning (SL) is the ability to implicitly extract regularities in the environment, and it likely supports various higher-order behaviors, from language to music and vision. While specific patterns of experience are likely to influence SL outcomes, this ability is tacitly conceptualized as a fixed construct, and few studies to date have investigated how experience may shape statistical learning. We report one experiment that directly tested whether SL can be modulated by previous experience. We used a pre-post treatment design that allowed us to pinpoint which specific aspects of "previous experience" matter for SL. The results show that performance on an artificial grammar learning task at post-test depends on whether the grammar to be learned at post-test matches the underlying grammar structures learned during treatment. Our study is the first to adopt a pre-post test design to directly modulate the effects of learning on learning itself.


Similar articles

Trainable Visual Models for Object Class Recognition

Recognizing object classes, such as cars, planes or elephants, in an image or a video remains one of the most challenging problems in Computer Vision. However, recently a number of successes have been achieved by using ideas and algorithms from statistical learning theory, where visual models are trained using positive and negative examples of the class.


A Trainable Tokenizer, solution for multilingual texts and compound expression tokenization

Tokenization is one of the initial steps done for almost any text processing task. It is not particularly recognized as a challenging task for English monolingual systems but it rapidly increases in complexity for systems that apply it for different languages. This article proposes a supervised learning approach to perform the tokenization task. The method presented in this article is based on ...


Trainable fusion rules. II. Small sample-size effects

A thorough theoretical analysis of the small-sample properties of trainable fusion rules is performed to determine in which situations neural network ensembles can improve or degrade classification results. We consider small-sample effects specific to multiple-classifier system design in the two-category case for two important fusion rules: (1) linear weighted average (weighted voting), realize...


Phrase-based Statistical Language Generation using Graphical Models and Active Learning

Most previous work on trainable language generation has focused on two paradigms: (a) using a statistical model to rank a set of generated utterances, or (b) using statistics to inform the generation decision process. Both approaches rely on the existence of a handcrafted generator, which limits their scalability to new domains. This paper presents BAGEL, a statistical language generator which ...


A Trainable Method For Extracting Chinese Entity Names And Their Relations

In this paper we propose a trainable method for extracting Chinese entity names and their relations. We view the entire problem as a series of classification problems and employ memory-based learning (MBL) to resolve them. Preliminary results show that this method is efficient, flexible, and promising to achieve better performance than other existing methods.


Using Negative Correlation Learning to Improve the Performance of Neural Network Ensembles

This paper investigates the effect of diversity caused by Negative Correlation Learning (NCL) in the combination of neural classifiers and presents an efficient way to improve combining performance. Decision Templates and Averaging, as two non-trainable combining methods, and Stacked Generalization, as a trainable combiner, are investigated in our experiments. Utilizing NCL for diversifying the ba...




Publication date: 2015